Talk by David Klein of Audience, Inc. Given to the Redwood Center for Theoretical Neuroscience at UC Berkeley.
Our computing devices are outfitted with more sensors and dedicated sensor processors than ever before, spanning audio, image, motion, touch, climate, and more. Meanwhile, the technology industry is rapidly adopting sensor-processing strategies based on learning algorithms rather than pure human design. Many of these algorithms are inspired, to varying degrees, by research in biological sensory learning. From the perspective of a computational neuroscientist and engineer with ten years of experience in bio-inspired technology development, this talk will address the role of sensory learning in several next-generation products. Focusing mostly on audio, image, and motion sensors, the talk will cover products spanning the algorithmic domains of source separation, enhancement, detection, recognition, compression, and context awareness. It will examine the algorithmic and implementational problems that must be solved, with an eye toward how basic research can both inform and benefit from this growing wave.